
    Analyzing and Visualizing Cosmological Simulations with ParaView

    The advent of large cosmological sky surveys - ushering in the era of precision cosmology - has been accompanied by ever larger cosmological simulations. The analysis of these simulations, which currently encompass tens of billions of particles and will soon reach a trillion particles, is often as daunting as carrying out the simulations in the first place. Therefore, the development of very efficient analysis tools combining qualitative and quantitative capabilities is a matter of some urgency. In this paper we introduce new analysis features implemented within ParaView, a parallel, open-source visualization toolkit, to analyze large N-body simulations. The new features include particle readers and a very efficient halo finder which identifies friends-of-friends halos and determines common halo properties. In combination with many other functionalities already existing within ParaView, such as histogram routines or interfaces to Python, this enhanced version enables fast, interactive, and convenient analyses of large cosmological simulations. In addition, development paths are available for future extensions.
    Comment: 9 pages, 8 figures
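The friends-of-friends approach mentioned in the abstract can be sketched compactly. The following is a minimal illustration, not ParaView's actual implementation: particles closer than a linking length are joined via union-find, and the resulting connected components are the halos. The function name and test points are invented for the example.

```python
import math

def fof_halos(points, linking_length):
    """Friends-of-friends: particles closer than the linking length
    are linked; connected components form halos (union-find)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    # Naive O(n^2) pair search; production halo finders use spatial trees.
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) <= linking_length:
                union(i, j)

    halos = {}
    for i in range(n):
        halos.setdefault(find(i), []).append(i)
    return list(halos.values())

# Two well-separated clumps plus an isolated particle.
pts = [(0, 0, 0), (0.1, 0, 0), (0.2, 0, 0),
       (5, 5, 5), (5.1, 5, 5),
       (9, 9, 9)]
print(sorted(len(h) for h in fof_halos(pts, 0.2)))  # [1, 2, 3]
```

Common halo properties (center of mass, particle count, velocity dispersion) then follow from a pass over each component's member particles.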

    Impact-induced devolatilization and hydrogen isotopic fractionation of serpentine: Implications for planetary accretion

    Impact-induced devolatilization of porous serpentine was investigated using two independent experimental methods, the gas recovery and the solid recovery method, each yielding nearly identical results. For shock pressures near incipient devolatilization, the hydrogen isotopic composition of the evolved H2O is very close to that of the starting material. For shock pressures at which up to 12 percent impact-induced devolatilization occurs, the bulk evolved gas is significantly lower in deuterium than the starting material. There is also significant reduction of H2O to H2 in gases recovered at these higher shock pressures, probably caused by reaction of evolved H2O with the metal gas recovery fixture. Gaseous H2O-H2 isotopic fractionation suggests high temperature isotopic equilibrium between the gaseous species, indicating initiation of devolatilization at sites of greater than average energy deposition. Bulk gas-residual solid isotopic fractionations indicate nonequilibrium, kinetic control of gas-solid isotopic ratios. Impact-induced hydrogen isotopic fractionation of hydrous silicates during accretion can strongly affect the long-term isotopic ratios of planetary bodies, leaving the interiors enriched in deuterium. Depending on the model used to extrapolate the isotopic fractionation to devolatilization fractions greater than those investigated experimentally, large isotopic effects can result from this process.
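Kinetic, nonequilibrium control of gas-solid isotopic ratios of the kind described above is often modeled as Rayleigh distillation. A minimal sketch, assuming a Rayleigh model with an illustrative gas/solid fractionation factor (the abstract gives no numerical value): when the evolved gas is depleted in deuterium, the residual solid becomes progressively D-enriched.

```python
def rayleigh_residual_ratio(R0, f_remaining, alpha):
    """Isotope ratio of the residual reservoir under Rayleigh
    distillation: R = R0 * f^(alpha - 1), where alpha is the
    gas/solid fractionation factor and f the mass fraction remaining."""
    return R0 * f_remaining ** (alpha - 1)

R0 = 1.56e-4   # illustrative starting D/H ratio (assumed, not from the paper)
alpha = 0.95   # assumed alpha < 1: evolved gas depleted in deuterium
for lost in (0.05, 0.12):
    R = rayleigh_residual_ratio(R0, 1 - lost, alpha)
    print(f"{lost:.0%} devolatilized -> residual D/H shifted by "
          f"{(R / R0 - 1) * 1e3:+.1f} per mil")
```

With alpha below one, the residual reservoir's D/H rises monotonically as devolatilization proceeds, consistent with interiors left enriched in deuterium.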

    Dynamic compression and volatile release of carbonates

    Particle velocity profiles upon shock compression and adiabatic release were measured for polycrystalline calcite (Solenhofen limestone) to 12–24 GPa and for porous calcite (Dover chalk, ρ_o = 1.40 g/cm^3, 49% porosity) to between 5 and 11 GPa. The electromagnetic particle velocity gauge method was used. Upon shock compression of Solenhofen limestone, the Hugoniot elastic limit was determined to vary from 0.36 to 0.45 GPa. Transition shocks at between 2.5 and 3.7 GPa, possibly arising from the calcite II-III transition, were observed. For the Solenhofen limestone, the release paths lie relatively close to the Hugoniot. Evidence for the occurrence of the calcite III-II transition upon release was observed, but no rarefaction shocks were detected. Initial release wave speeds suggest retention of shear strength up to at least 20 GPa, with a possible loss of shear strength at higher pressures. The measured equation of state is used to predict the fraction of material devolatilized upon adiabatic release as a function of shock pressure. The effect of ambient partial pressure of CO_2 on the calculations is demonstrated. P_(CO_2) should be taken into account in models of atmospheric evolution by means of impact-induced mineral devolatilization. Mass fractions of CO_2 release expected on the basis of a continuum model are much lower than those determined experimentally. This discrepancy, and the radiative characteristics of shocked calcite, indicate that localization of thermal energy (shear banding) occurs under shock compression even though no solid-solid transitions occur in this pressure range. Release adiabat data indicate that Dover chalk loses its shear strength when shocked to 10 GPa. At 5 GPa the present data are ambiguous regarding shear strength. For Dover chalk, continuum shock entropy calculations result in a minimum estimate of 90% devolatilization upon complete release from 10 GPa.
    For calcite, isentropic release paths from calculated continuum Hugoniot temperatures cross into the CaO (solid) + CO_2 (vapor) field at improbably low pressures (for example, 10 GPa for a shock pressure of 25 GPa). However, calculated isentropic release paths originating from P-T points corresponding to previous color temperature measurements under shock cross into the melt plus vapor field at pressures greater than 0.5 GPa, suggesting that devolatilization is initiated at the shear banding sites.
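Continuum shock-entropy estimates of the devolatilized fraction, of the kind used for the Dover chalk figure above, amount to a lever rule between the entropies of the coexisting phases on release. A minimal sketch with assumed, illustrative entropy values (not the paper's data):

```python
def devolatilized_fraction(S_shock, S_solid, S_vapor):
    """Lever-rule estimate: the entropy deposited by the shock fixes
    the vapor mass fraction when the release isentrope ends in the
    solid + vapor two-phase field. Result is clamped to [0, 1]."""
    f = (S_shock - S_solid) / (S_vapor - S_solid)
    return min(1.0, max(0.0, f))

# Assumed phase-boundary entropies, purely for illustration (kJ/kg/K).
S_solid, S_vapor = 1.0, 3.0
for S_shock in (0.5, 2.0, 4.0):
    print(f"S_shock = {S_shock}: fraction = {devolatilized_fraction(S_shock, S_solid, S_vapor)}")
```

The abstract's point is that such continuum estimates underpredict the experimentally measured CO_2 release, which is the evidence for localized energy deposition in shear bands.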

    Multivariate Pointwise Information-Driven Data Sampling and Visualization

    With the increasing computing capabilities of modern supercomputers, the size of the data generated by scientific simulations is growing rapidly. As a result, application scientists need effective data summarization techniques that can reduce large-scale multivariate spatiotemporal data sets while preserving the important data properties, so that the reduced data can answer domain-specific queries involving multiple variables with sufficient accuracy. While analyzing complex scientific events, domain experts often analyze and visualize two or more variables together to obtain a better understanding of the characteristics of the data features. Therefore, data summarization techniques are required that analyze multi-variable relationships in detail and then perform data reduction such that the important features involving multiple variables are preserved in the reduced data. To achieve this, in this work, we propose a data sub-sampling algorithm for statistical data summarization that leverages pointwise information theoretic measures to quantify the statistical association of data points across multiple variables, and generates sub-sampled data that preserve the statistical association among those variables. Using such reduced sampled data, we show that multivariate feature query and analysis can be done effectively. The efficacy of the proposed multivariate association-driven sampling algorithm is presented by applying it to several scientific data sets.
    Comment: 25 pages
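Pointwise information measures of this kind are easiest to see on discrete data. A minimal sketch, assuming pointwise mutual information (PMI) as the association measure and a simple keep-the-top-k selection; the paper's actual algorithm may differ in both the measure and the selection strategy.

```python
import math
from collections import Counter

def pmi_scores(pairs):
    """Pointwise mutual information for each (x, y) sample:
    PMI = log( p(x, y) / (p(x) * p(y)) ). Points with high PMI
    exhibit strong multivariate association."""
    n = len(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    pxy = Counter(pairs)
    return [math.log((pxy[p] / n) / ((px[p[0]] / n) * (py[p[1]] / n)))
            for p in pairs]

def pmi_subsample(pairs, k):
    """Keep the k samples with the highest PMI (association-driven
    sub-sampling sketch)."""
    scored = sorted(zip(pmi_scores(pairs), range(len(pairs))), reverse=True)
    return [pairs[i] for _, i in scored[:k]]

# Two strongly associated value combinations plus two outliers.
data = [("hot", "fast")] * 4 + [("cold", "slow")] * 4 + \
       [("hot", "slow"), ("cold", "fast")]
print(pmi_subsample(data, 4))
```

On continuous simulation variables the probabilities would come from (joint) histograms or density estimates rather than exact counts, but the scoring idea is the same.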

    Reducing Occlusion in Cinema Databases through Feature-Centric Visualizations

    In modern supercomputer architectures, the I/O capabilities do not keep up with the computational speed. Image-based techniques are one very promising approach to a scalable output format for visual analysis, in which a reduced output that corresponds to the visible state of the simulation is rendered in situ and stored to disk. These techniques can support interactive exploration of the data through image compositing and other methods, but automatic methods of highlighting data and reducing clutter can make them more effective. In this paper, we suggest a method of assisted exploration that combines feature-centric analysis with image-space techniques, and show how reducing the data to features of interest reduces occlusion in the output for a set of example applications.
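The interaction between image-space compositing and feature-centric reduction can be sketched in a few lines. The data layout and function names here are invented for illustration: each in situ layer stores per-pixel depth and value, and filtering a layer down to the features of interest removes fragments that would otherwise occlude them.

```python
def composite(layers):
    """Image-space compositing of in situ layers: each layer maps
    pixel -> (depth, value); the nearest fragment wins."""
    out = {}
    for layer in layers:
        for px, (depth, value) in layer.items():
            if px not in out or depth < out[px][0]:
                out[px] = (depth, value)
    return out

def feature_filter(layer, keep):
    """Feature-centric reduction: drop fragments whose value is not
    a feature of interest, so they no longer occlude anything."""
    return {px: dv for px, dv in layer.items() if dv[1] in keep}

background = {(0, 0): (1.0, "wall"), (1, 0): (1.0, "wall")}
feature    = {(0, 0): (2.0, "vortex")}
# Without filtering, the nearer wall occludes the vortex at (0, 0):
print(composite([background, feature])[(0, 0)])
# Filtering the background layer to features of interest reveals it:
print(composite([feature_filter(background, set()), feature])[(0, 0)])
```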

    MolSieve: A Progressive Visual Analytics System for Molecular Dynamics Simulations

    Molecular Dynamics (MD) simulations are ubiquitous in cutting-edge physico-chemical research. They provide critical insights into how a physical system evolves over time given a model of interatomic interactions. Understanding a system's evolution is key to selecting the best candidates for new drugs, materials for manufacturing, and countless other practical applications. With today's technology, these simulations can encompass millions of unit transitions between discrete molecular structures, spanning up to several milliseconds of real time. Attempting to perform a brute-force analysis on datasets of this size is not only computationally impractical, but would not shed light on the physically relevant features of the data. Moreover, there is a need to analyze simulation ensembles in order to compare similar processes in differing environments. These problems call for an approach that is analytically transparent, computationally efficient, and flexible enough to handle the variety found in materials-based research. To address these problems, we introduce MolSieve, a progressive visual analytics system that enables the comparison of multiple long-duration simulations. Using MolSieve, analysts are able to quickly identify and compare regions of interest within immense simulations through its combination of control charts, data-reduction techniques, and highly informative visual components. A simple programming interface is provided that allows experts to fit MolSieve to their needs. To demonstrate the efficacy of our approach, we present two case studies of MolSieve and report on findings from domain collaborators.
    Comment: Updated references to GPCC
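Control-chart filtering of a long trajectory can be sketched as a windowed test against the global mean. The window size and 3-sigma threshold below are illustrative assumptions, not MolSieve's actual parameters, but the idea is the same: cheaply flag candidate regions of interest before any expensive analysis.

```python
import statistics

def regions_of_interest(series, window, z_thresh=3.0):
    """Control-chart pass: flag window start indices whose mean
    deviates from the global mean by more than z_thresh standard
    errors -- a cheap first filter over a long trajectory."""
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    se = sigma / window ** 0.5
    flagged = []
    for start in range(0, len(series) - window + 1, window):
        w = series[start:start + window]
        if abs(statistics.mean(w) - mu) > z_thresh * se:
            flagged.append(start)
    return flagged

# A long quiet trajectory with one burst (e.g. a rare state transition).
signal = [0.0] * 40 + [5.0] * 10 + [0.0] * 50
print(regions_of_interest(signal, 10))  # [40]
```

In a progressive system, only the flagged windows would then be handed to the heavier data-reduction and visualization stages.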

    Optimizing Error-Bounded Lossy Compression for Three-Dimensional Adaptive Mesh Refinement Simulations

    Today's scientific simulations require a significant reduction of data volume because of the extremely large amounts of data they produce and the limited I/O bandwidth and storage space. Error-bounded lossy compression has been considered one of the most effective solutions to this problem. However, little work has been done to improve error-bounded lossy compression for Adaptive Mesh Refinement (AMR) simulation data. Unlike previous work that only leverages 1D compression, in this work we propose to leverage high-dimensional (e.g., 3D) compression for each refinement level of AMR data. To remove the data redundancy across different levels, we propose three pre-process strategies and adaptively apply them based on the data characteristics. Experiments on seven AMR datasets from a real-world large-scale AMR simulation demonstrate that our proposed approach can improve the compression ratio by up to 3.3X under the same data distortion, compared to the state-of-the-art method. In addition, we leverage the flexibility of our approach to tune the error bound for each level, which achieves much lower data distortion on two application-specific metrics.
    Comment: 13 pages, 17 figures, 3 tables, accepted by ACM HPDC 202
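The core of error-bounded lossy compression can be illustrated with a one-dimensional predictor plus linear-scaling quantizer. This is a generic sketch of the technique, not the paper's 3D AMR-aware method: each value is predicted from the previous reconstruction, and the residual is stored as an integer code of width twice the error bound, which guarantees pointwise error within the bound.

```python
def compress(data, bound):
    """Predictor + linear-scaling quantizer sketch: store each
    residual as an integer code; reconstruction error per point is
    guaranteed to stay within `bound`."""
    codes, recon, prev = [], [], 0.0
    for x in data:
        code = round((x - prev) / (2 * bound))   # quantize the residual
        value = prev + code * 2 * bound          # decoder's reconstruction
        codes.append(code)
        recon.append(value)
        prev = value                             # predict from reconstruction
    return codes, recon

data = [1.00, 1.02, 1.04, 2.50, 2.52]
codes, recon = compress(data, bound=0.05)
assert all(abs(a - b) <= 0.05 for a, b in zip(data, recon))
print(codes)  # runs of small integers compress well with an entropy coder
```

The high-dimensional variants the paper proposes follow the same predict-quantize-encode pattern, but predict from spatial neighbors within each AMR refinement level instead of a 1D predecessor.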

    Cinema Darkroom: A Deferred Rendering Framework for Large-Scale Datasets

    This paper presents a framework that fully leverages the advantages of a deferred rendering approach for the interactive visualization of large-scale datasets. Geometry buffers (G-Buffers) are generated and stored in situ, and shading is performed post hoc in an interactive image-based rendering front end. This decoupled framework has two major advantages. First, the G-Buffers only need to be computed and stored once, which corresponds to the most expensive part of the rendering pipeline. Second, the stored G-Buffers can later be consumed in an image-based rendering front end that enables users to interactively adjust various visualization parameters, such as the applied color map or the strength of ambient occlusion, for which suitable choices are often not known a priori. This paper demonstrates the use of Cinema Darkroom (CD) on several real-world datasets, highlighting CD's ability to effectively decouple the complexity and size of the dataset from its visualization.
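The post hoc shading step can be sketched as a pure function over stored G-buffer pixels. The buffer layout, light model, and color map below are invented for illustration; the point is that swapping the color map or lighting requires no re-render of the simulation data.

```python
def shade(gbuffer, light_dir, colormap):
    """Post hoc shading of a stored G-buffer: each pixel carries a
    surface normal and a scalar field value; Lambert shading and
    color mapping are applied interactively, long after the in situ
    run that produced the buffer."""
    nx, ny, nz = light_dir
    image = []
    for normal, scalar in gbuffer:
        lambert = max(0.0, normal[0] * nx + normal[1] * ny + normal[2] * nz)
        r, g, b = colormap(scalar)
        image.append((r * lambert, g * lambert, b * lambert))
    return image

# Hypothetical two-pixel buffer: (normal, scalar value) per pixel.
gbuf = [((0.0, 0.0, 1.0), 0.2), ((1.0, 0.0, 0.0), 0.8)]
gray = lambda s: (s, s, s)
# Changing `gray` to any other color map re-shades without re-rendering.
print(shade(gbuf, (0.0, 0.0, 1.0), gray))
```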